Dimensionality reduction based on ICA for regression problems
Authors
Abstract
Similar resources
Dimensionality Reduction Based on ICA for Regression Problems
In manipulating data, as in supervised learning, we often extract new features from the original input variables in order to reduce the dimension of the input space and achieve better performance. In this paper, we show how standard algorithms for independent component analysis (ICA) can be extended to extract attributes for regression problems. The advantage is that general ICA al...
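As a rough illustration of the idea in this abstract (not the authors' algorithm), the sketch below extracts independent components from the inputs with scikit-learn's FastICA and uses them as features for a linear regressor; the synthetic dataset, component count, and choice of regressor are illustrative assumptions only.

```python
# Hedged sketch: ICA-derived features as inputs to a regressor (illustrative only).
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.RandomState(0)
X = rng.laplace(size=(500, 20))                       # 20 original input variables
y = X[:, :3] @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.normal(size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ica = FastICA(n_components=5, random_state=0)         # 20 dims -> 5 extracted attributes
Z_tr = ica.fit_transform(X_tr)
Z_te = ica.transform(X_te)

reg = LinearRegression().fit(Z_tr, y_tr)
print("held-out R^2:", reg.score(Z_te, y_te))
```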
Random Projections for Dimensionality Reduction in ICA
In this paper we present a technique to speed up ICA based on the idea of reducing the dimensionality of the data set while preserving the quality of the results. In particular, we refer to the FastICA algorithm, which uses kurtosis as the statistical property to be maximized. By performing a particular Johnson-Lindenstrauss-like projection of the data set, we find the minimum dimensionality reduction rate...
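A minimal sketch of that pipeline, assuming scikit-learn: a Johnson-Lindenstrauss-style Gaussian random projection reduces the dimensionality first, then FastICA runs on the projected data (the "cube" nonlinearity corresponds to a kurtosis-based contrast). The projection dimension of 50 is an arbitrary choice here, not the minimum rate the paper derives.

```python
# Hedged sketch: random projection before FastICA (all dimensions are illustrative).
import numpy as np
from sklearn.random_projection import GaussianRandomProjection
from sklearn.decomposition import FastICA

rng = np.random.RandomState(0)
X = rng.laplace(size=(1000, 500))            # high-dimensional, super-Gaussian data

proj = GaussianRandomProjection(n_components=50, random_state=0)
X_low = proj.fit_transform(X)                # 500 dims -> 50 dims via JL-style projection

ica = FastICA(n_components=10, fun="cube", random_state=0)   # "cube" ~ kurtosis contrast
S = ica.fit_transform(X_low)                 # estimated independent components
print(S.shape)                               # (1000, 10)
```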
Spectral Regression for Dimensionality Reduction
Spectral methods have recently emerged as a powerful tool for dimensionality reduction and manifold learning. These methods use information contained in the eigenvectors of a data affinity (i.e., item-item similarity) matrix to reveal low dimensional structure in high dimensional data. The most popular manifold learning algorithms include Locally Linear Embedding, Isomap, and Laplacian Eigenmap...
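To illustrate the general spectral idea described above (eigenvectors of a data affinity matrix revealing low-dimensional structure), here is a small sketch using scikit-learn's SpectralEmbedding, which implements a Laplacian-Eigenmaps-style embedding, on a synthetic swiss roll; it is not the Spectral Regression method of the paper itself, and the neighborhood size is an assumed value.

```python
# Hedged sketch: spectral embedding from a neighborhood affinity graph.
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import SpectralEmbedding

X, _ = make_swiss_roll(n_samples=1000, random_state=0)

embed = SpectralEmbedding(n_components=2, n_neighbors=10, random_state=0)
Z = embed.fit_transform(X)       # eigenvectors of the graph Laplacian give the embedding
print(Z.shape)                   # (1000, 2)
```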
Nonlinear Dimensionality Reduction for Regression
The task of dimensionality reduction for regression (DRR) is to find a low-dimensional representation z ∈ R^q of the input covariates x ∈ R^p, with q ≪ p, for regressing the output y ∈ R^d. DRR can be beneficial for visualization of high-dimensional data, for efficient regressor design with a reduced input dimension, but also when eliminating noise in the data x through uncovering the essential information z f...
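The DRR setting can be sketched as a two-stage pipeline, shown below under assumed choices (kernel PCA as a generic nonlinear reducer and ridge regression on the reduced representation); this only illustrates the task definition, not the method proposed in the paper.

```python
# Hedged sketch of the DRR setting: learn a low-dimensional z from x, then regress y on z.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.RandomState(0)
X = rng.normal(size=(400, 30))                           # covariates x in R^p, p = 30
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=400)

drr = make_pipeline(KernelPCA(n_components=5, kernel="rbf"),   # representation z in R^q, q = 5
                    Ridge(alpha=1.0))
print(cross_val_score(drr, X, y, cv=5).mean())           # mean cross-validated R^2
```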
A Grassmann-Rayleigh Quotient Iteration for Dimensionality Reduction in ICA
We derive a Grassmann-Rayleigh Quotient Iteration for the computation of the best rank-(R1, R2, R3) approximation of higher-order tensors. We present some variants that allow for a very efficient estimation of the signal subspace in ICA schemes without prewhitening.
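For context, the sketch below computes a best rank-(R1, R2, R3) approximation of a third-order tensor with the standard higher-order orthogonal iteration (HOOI) in plain NumPy; this is a different, simpler scheme than the Grassmann-Rayleigh quotient iteration derived in the paper, and the tensor size, ranks, and iteration count are arbitrary assumptions.

```python
# Hedged sketch: HOOI for a best rank-(R1, R2, R3) approximation (not the paper's GRQI).
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move `mode` to the front and flatten the remaining modes."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_dot(T, M, mode):
    """Multiply tensor T by matrix M along the given mode."""
    return np.moveaxis(np.tensordot(M, T, axes=(1, mode)), 0, mode)

def hooi(T, ranks, n_iter=20):
    """Rank-(R1, R2, R3) approximation via higher-order orthogonal iteration."""
    # HOSVD initialization: leading left singular vectors of each unfolding.
    U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
         for n, r in enumerate(ranks)]
    for _ in range(n_iter):
        for n in range(T.ndim):
            # Project every mode except n onto its current subspace, then update mode n.
            Y = T
            for m in range(T.ndim):
                if m != n:
                    Y = mode_dot(Y, U[m].T, m)
            U[n] = np.linalg.svd(unfold(Y, n), full_matrices=False)[0][:, :ranks[n]]
    # Core tensor and low-multilinear-rank reconstruction.
    G = T
    for n in range(T.ndim):
        G = mode_dot(G, U[n].T, n)
    T_hat = G
    for n in range(T.ndim):
        T_hat = mode_dot(T_hat, U[n], n)
    return T_hat, U

rng = np.random.RandomState(0)
T = rng.normal(size=(10, 12, 14))
T_hat, U = hooi(T, ranks=(3, 3, 3))
print(np.linalg.norm(T - T_hat) / np.linalg.norm(T))     # relative approximation error
```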
Journal
Journal title: Neurocomputing
Year: 2008
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2007.11.036